Sparse Graph Attention Networks
Authors
Abstract
Graph Neural Networks (GNNs) have proved to be an effective representation learning framework for graph-structured data, and have achieved state-of-the-art performance on many practical predictive tasks. Among the variants of GNNs, Graph Attention Networks (GATs) improve performance on graph tasks through a dense attention mechanism. However, real-world graphs are often very large and noisy, and GATs are prone to overfitting if not regularized properly. In this paper, we propose Sparse Graph Attention Networks (SGATs) that learn sparse attention coefficients under an L0-norm regularization; the learned attentions are then shared by all GNN layers, resulting in an edge-sparsified graph. By doing so, SGATs can identify noisy/task-irrelevant edges and thus perform feature aggregation over the most informative neighbors. Extensive experiments on synthetic and real-world (assortative and disassortative) graph benchmarks demonstrate the superior performance of SGATs. Furthermore, the removed edges can be interpreted intuitively and quantitatively. To the best of our knowledge, this is the first graph learning algorithm that shows that graphs contain significant redundancies, and that edge-sparsified graphs can achieve similar (on assortative graphs) or sometimes higher (on disassortative graphs) predictive performance than the original graphs. Our code is available at https://github.com/Yangyeeee/SGAT.
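As a hedged illustration of the mechanism described in the abstract, the sketch below regularizes per-edge attention with binary gates relaxed by a hard-concrete distribution, which is a common way to make an L0-norm penalty differentiable. The class name EdgeGate, the hyperparameter values, and the use of PyTorch are assumptions for illustration only, not the authors' implementation (see the linked repository for that).

# Minimal sketch (not the authors' code): one binary gate per edge, relaxed with
# a hard-concrete distribution so an L0-style penalty can be back-propagated.
import math
import torch
import torch.nn as nn

class EdgeGate(nn.Module):
    """One learnable gate per edge; gates driven to exactly zero mark removed edges."""
    def __init__(self, num_edges, beta=2/3, gamma=-0.1, zeta=1.1):
        super().__init__()
        self.log_alpha = nn.Parameter(torch.zeros(num_edges))  # per-edge gate logits
        self.beta, self.gamma, self.zeta = beta, gamma, zeta

    def forward(self):
        if self.training:
            u = torch.rand_like(self.log_alpha).clamp(1e-6, 1 - 1e-6)
            s = torch.sigmoid((u.log() - (1 - u).log() + self.log_alpha) / self.beta)
        else:
            s = torch.sigmoid(self.log_alpha)
        # stretch to (gamma, zeta), then clamp to [0, 1] so exact zeros can occur
        return (s * (self.zeta - self.gamma) + self.gamma).clamp(0.0, 1.0)

    def l0_penalty(self):
        # expected number of non-zero gates: a differentiable surrogate for the L0 norm
        return torch.sigmoid(self.log_alpha - self.beta * math.log(-self.gamma / self.zeta)).sum()

# usage sketch: scale each edge's attention coefficient by gate() before aggregation,
# and add lambda_reg * gate.l0_penalty() to the task loss.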
Similar resources
Graph Attention Networks
We present graph attention networks (GATs), novel neural network architectures that operate on graph-structured data, leveraging masked self-attentional layers to address the shortcomings of prior methods based on graph convolutions or their approximations. By stacking layers in which nodes are able to attend over their neighborhoods’ features, we enable (implicitly) specifying different weight...
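For readers unfamiliar with the mechanism referred to above, the following is a minimal, dense-matrix sketch of a single GAT attention head. The function name gat_attention and the dense adjacency mask are illustrative assumptions; practical GAT implementations use sparse neighborhood operations and multiple heads.

# Minimal sketch of one GAT attention head over a dense 0/1 adjacency mask.
import torch
import torch.nn.functional as F

def gat_attention(h, adj, W, a, negative_slope=0.2):
    """h: [N, F_in] features; adj: [N, N] 0/1 mask (include self-loops); W: [F_in, F_out]; a: [2*F_out]."""
    z = h @ W                                              # projected features, [N, F_out]
    f_out = z.size(1)
    src = (z @ a[:f_out]).unsqueeze(1)                     # score contribution of the source node, [N, 1]
    dst = (z @ a[f_out:]).unsqueeze(0)                     # score contribution of the target node, [1, N]
    scores = F.leaky_relu(src + dst, negative_slope)       # e_ij = LeakyReLU(a^T [W h_i || W h_j]), [N, N]
    scores = scores.masked_fill(adj == 0, float('-inf'))   # attend only over existing edges
    alpha = torch.softmax(scores, dim=1)                   # attention coefficients per neighborhood
    return alpha @ z                                       # aggregated node features, [N, F_out]

# example shapes: h = torch.randn(4, 3); adj = torch.eye(4); W = torch.randn(3, 2); a = torch.randn(4)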
Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks
Celebrated Sequence to Sequence learning (Seq2Seq) and its fruitful variants are powerful models that achieve excellent performance on tasks that map sequences to sequences. However, there are many machine learning tasks with inputs naturally represented in the form of graphs, which imposes significant challenges to existing Seq2Seq models for lossless conversion from its graph form to the sequ...
Edge Attention-based Multi-Relational Graph Convolutional Networks
The graph convolutional network (GCN) is a generalization of the convolutional neural network (CNN) to arbitrarily structured graphs. A binary adjacency matrix is commonly used in training a GCN. Recently, the attention mechanism allows the network to learn a dynamic and adaptive aggregation of the neighborhood. We propose a new GCN model on graphs where edges are characterized in multiple ...
Sparse Reliable Graph Backbones
Given a connected graph G and a failure probability p_e for each edge e in G, the reliability of G is the probability that G remains connected when each edge e is removed independently with probability p_e. In this paper it is shown that every n-vertex graph contains a sparse backbone, i.e., a spanning subgraph with O(n log n) edges whose reliability is at least (1 − n^(−Ω(1))) times that of G. Moreove...
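The reliability notion defined above can be illustrated with a naive Monte Carlo estimate. The snippet below is only a didactic sketch (the cited paper concerns the existence of sparse backbones, not this estimator); the use of networkx and the function name estimate_reliability are assumptions.

# Illustrative Monte Carlo estimate of graph reliability under independent edge failures.
import random
import networkx as nx

def estimate_reliability(G, fail_prob, trials=10_000):
    """fail_prob: dict keyed exactly like the tuples in G.edges, giving each edge's failure probability p_e."""
    connected = 0
    for _ in range(trials):
        H = nx.Graph()
        H.add_nodes_from(G.nodes)
        # keep each edge independently with probability 1 - p_e
        H.add_edges_from(e for e in G.edges if random.random() >= fail_prob[e])
        connected += nx.is_connected(H)
    return connected / trials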
Dense & Sparse Graph Partition
In a graph G = (V, E), the density is the ratio between the number of edges |E| and the number of vertices |V|. This criterion may be used to find communities in a graph: groups of highly connected vertices. We propose an optimization problem based on this criterion: the idea is to find the vertex partition that maximizes the sum of the densities of each class. We prove that this problem is NP...
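The objective described above can be written down directly. The following sketch evaluates the sum of per-class densities for a given vertex partition; the use of networkx and the function name partition_density are illustrative assumptions.

# Illustrative evaluation of the partition-density objective: sum over classes of |E_i| / |V_i|.
import networkx as nx

def partition_density(G, partition):
    """partition: disjoint vertex sets covering G's nodes; returns the sum of per-class densities."""
    total = 0.0
    for cls in partition:
        if cls:  # skip empty classes
            total += G.subgraph(cls).number_of_edges() / len(cls)
    return total

# e.g. partition_density(nx.karate_club_graph(), [set(range(17)), set(range(17, 34))])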
Journal
Journal title: IEEE Transactions on Knowledge and Data Engineering
Year: 2021
ISSN: 1558-2191, 1041-4347, 2326-3865
DOI: https://doi.org/10.1109/tkde.2021.3072345